# Task Generalization
## DeepSeek-R1-Distill-Llama-8B-Abliterated (stepenZEN)
DeepSeek-R1-Distill-Llama-8B is a distilled large language model based on the Llama architecture, with a parameter scale of 8B, primarily designed for English text generation and comprehension tasks.
Large Language Model · Transformers · English
## AgentLM-70B (THUDM)
AgentLM-70B is a large language model obtained by mixed training on the AgentInstruct and ShareGPT datasets on top of the Llama-2-chat model, focusing on improved agent capabilities and general language ability.
Large Language Model · Transformers
## FLAN-T5 XL (google)
License: Apache-2.0
FLAN-T5 XL is an instruction-finetuned language model based on the T5 architecture, with significantly improved multilingual and few-shot performance after fine-tuning on more than 1,000 tasks.
Large Language Model · Supports Multiple Languages
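Since these entries are tagged with the Transformers library, a minimal usage sketch for the FLAN-T5 XL entry is shown below. It assumes the Hugging Face Hub model ID `google/flan-t5-xl` and the standard `transformers` seq2seq API; the listing itself does not specify a model ID, so adjust it for your own deployment.

```python
# Minimal sketch, assuming the FLAN-T5 XL entry above corresponds to the
# Hub model ID "google/flan-t5-xl" (an assumption, not stated in the listing).
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "google/flan-t5-xl"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# FLAN-T5 is instruction-finetuned, so a plain natural-language instruction
# works as a prompt without any task-specific prefix engineering.
prompt = "Translate to German: How old are you?"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```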